Class relationship-based knowledge distillation for efficient human parsing

Authors

Abstract

In computer vision, human parsing is challenging due to its demand for accurate region localization and semantic partitioning. As a dense prediction task, it typically requires computationally heavy, high-precision models. To enable real-time operation on resource-limited devices, the authors introduce a lightweight model that uses ResNet18 as the backbone network, a simplified pyramid module that improves contextual representation while reducing complexity, and an integrated spatial attention fusion strategy that counters the precision loss incurred by light-weighting. Traditional models, despite their segmentation precision, are limited by computational complexity and extensive parameters, so the authors apply knowledge distillation (KD) to enhance their network's accuracy. Conventional KD methods can fail to transfer useful knowledge when teacher and student differ significantly; hence, a novel approach based on inter-class and intra-class relations among prediction outcomes is used. Experiments on the Look into Person (LIP) dataset show that the model significantly reduces parameters while maintaining accuracy and enhancing inference speed.
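The abstract does not give the loss itself. Purely as a minimal sketch of what an inter-/intra-class relation distillation objective could look like for (batch, class, height, width) segmentation logits, assuming PyTorch; every function name and formula below is an illustrative guess, not the authors' published implementation:

```python
import torch
import torch.nn.functional as F

def class_relation_kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Hypothetical relation-based KD loss for (B, C, H, W) score maps.

    Inter-class term: match the teacher's and student's C x C cosine
    similarity matrices between per-class score maps.
    Intra-class term: match each class's softened spatial distribution.
    Sketch only; not the paper's loss.
    """
    teacher_logits = teacher_logits.detach()  # no gradients into the teacher

    def class_correlation(logits):
        flat = F.normalize(logits.flatten(2), dim=2)  # (B, C, H*W)
        return flat @ flat.transpose(1, 2)            # (B, C, C)

    inter = F.mse_loss(class_correlation(student_logits),
                       class_correlation(teacher_logits))

    # Per-class distribution over spatial positions, matched with KL
    s = F.log_softmax(student_logits.flatten(2) / temperature, dim=2)
    t = F.softmax(teacher_logits.flatten(2) / temperature, dim=2)
    intra = F.kl_div(s, t, reduction="batchmean") * temperature ** 2

    return inter + intra
```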


Similar Articles

Efficient Knowledge Distillation from an Ensemble of Teachers

This paper describes the effectiveness of knowledge distillation using teacher-student training for building accurate and compact neural networks. We show that with knowledge distillation, information from multiple acoustic models such as very deep VGG networks and Long Short-Term Memory (LSTM) models can be used to train standard convolutional neural network (CNN) acoustic models for a variety of...
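The snippet stops before the details. A minimal sketch of one common ensemble-distillation recipe, averaging the teachers' softened posteriors into a single soft target (the function name and temperature T are illustrative assumptions, not the paper's recipe):

```python
import torch
import torch.nn.functional as F

def ensemble_kd_loss(student_logits, teacher_logits_list, T=2.0):
    """Train a student against the average of several teachers' softened
    class posteriors (e.g. VGG and LSTM acoustic models). Sketch only."""
    with torch.no_grad():
        soft_target = torch.stack(
            [F.softmax(t / T, dim=-1) for t in teacher_logits_list]
        ).mean(dim=0)
    log_p = F.log_softmax(student_logits / T, dim=-1)
    # Soft cross-entropy, scaled by T^2 as in Hinton et al. (2015)
    return -(soft_target * log_p).sum(dim=-1).mean() * T * T
```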


Learning Efficient Object Detection Models with Knowledge Distillation

Despite significant accuracy improvements in convolutional neural network (CNN) based object detectors, they often require prohibitive runtimes to process an image for real-time applications. State-of-the-art models often use very deep networks with a large number of floating-point operations. Efforts such as model compression learn compact models with fewer parameters, but with much ...


Energy efficient distillation

Distillation is responsible for a significant share of the energy consumption of the world's process industry and of natural gas processing. There is a significant energy-saving potential that can be realised by applying new energy-saving distillation technology that has appeared in the last two decades. Fully thermally coupled dividing wall columns have the attractive feature of ...


Efficient Parsing for French

Parsing with categorial grammars often leads to problems such as proliferating lexical ambiguity, spurious parses and overgeneration. This paper presents a parser for French developed on a unification-based categorial grammar (FG) which avoids these problems. The parser is a bottom-up chart parser augmented with a heuristic eliminating spurious parses. The unicity and completeness of parsing...
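For flavour only, here is a toy bottom-up chart parser over categorial categories, restricted to forward/backward application with atomic arguments; this is a generic sketch, not the FG parser or its spurious-parse heuristic:

```python
from itertools import product

def combine(left, right):
    # Forward application:  X/Y  Y  =>  X
    if "/" in left and left.rsplit("/", 1)[1] == right:
        return left.rsplit("/", 1)[0]
    # Backward application: Y  X\Y  =>  X
    if "\\" in right and right.split("\\", 1)[1] == left:
        return right.split("\\", 1)[0]
    return None

def chart_parse(words, lexicon):
    """CKY-style bottom-up chart parser over categorial categories.
    Toy sketch with atomic arguments only."""
    n = len(words)
    chart = {(i, i + 1): set(lexicon.get(w, ())) for i, w in enumerate(words)}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            k = i + span
            chart[(i, k)] = {
                c
                for j in range(i + 1, k)
                for l, r in product(chart[(i, j)], chart[(j, k)])
                if (c := combine(l, r))
            }
    return chart[(0, n)]

# e.g. chart_parse("the cat sleeps".split(),
#                  {"the": {"NP/N"}, "cat": {"N"}, "sleeps": {"S\\NP"}})
# -> {"S"}
```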


Sequence-Level Knowledge Distillation

Neural machine translation (NMT) offers a novel alternative formulation of translation that is potentially simpler than statistical approaches. However, to reach competitive performance, NMT models need to be exceedingly large. In this paper we consider applying knowledge distillation approaches (Bucila et al., 2006; Hinton et al., 2015) that have proven successful for reducing the size of neural...
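The snippet truncates before the method. As a rough sketch: sequence-level KD (Kim & Rush, 2016) replaces the gold targets with the teacher's beam-search output, and the student is trained on those pseudo-targets with the ordinary loss. The `teacher.generate` call below assumes a Hugging Face style API; it is an assumption, not the paper's code:

```python
import torch

def sequence_level_kd_targets(teacher, src_token_ids, beam_size=5):
    """Decode the training sources with the teacher and reuse its best
    beam hypotheses as pseudo-targets for the student. Sketch only;
    `teacher.generate` follows a Hugging Face style API (assumed)."""
    with torch.no_grad():
        pseudo_targets = teacher.generate(src_token_ids, num_beams=beam_size)
    return pseudo_targets

# The student is then trained with the usual cross-entropy/NLL loss on
# (src_token_ids, pseudo_targets) pairs instead of the gold references.
```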



Journal

Journal: Electronics Letters

Year: 2023

ISSN: 0013-5194, 1350-911X

DOI: https://doi.org/10.1049/ell2.12900